Ideas Made to Matter

Social Media

Sheryl Sandberg on Facebook's missteps and what comes next

The more they ask “Could we?” the more creative people become. But the more they ask “Should we?” the more ethical they become.

That was the message from Facebook chief operating officer Sheryl Sandberg at MIT’s 2018 commencement ceremony, held June 8 on campus.

Facebook in the past year has faced a series of privacy scandals and ethical questions, most notably when it was reported that consulting firm Cambridge Analytica had mined the personal data of millions of Facebook users and used it to influence voter opinion. Facebook has also been criticized for failing to contain the spread of fake news.

Sandberg admitted Facebook failed to recognize the risks and head them off, but said she is proud of the work the company is doing.

“It’s painful when you miss something — when you make the mistake of believing so much in the good you are seeing that you don’t see the bad,” Sandberg told graduates. “It’s hard when you know that you let people down.”

“Today, anyone with an internet connection can inspire millions with a single sentence or a single image,” she said. “This gives extraordinary power to the people who use it to do good — to march for equality, reignite the movement against sexual harassment, rally around things they care about, and be there for the people they want to be there for. But it also empowers those who seek to do harm. When everyone has a voice, some raise their voices in hatred. When everyone can share, some share lies. When everyone can organize, some organize against the things we value most.”

Sandberg said there are three ways those issues can be addressed: turn tail and retreat, push forward with a single-minded belief in technology, or “fight like hell to do all the good we can do with the understanding that what we build will be used by people — and people are capable of both beauty and cruelty.”

For her, it’s the third option, and she urged the graduates to be optimistic about the future, “to see that building technology that supports equality, democracy, truth, and kindness means looking around corners, and throwing up every possible roadblock against hate, violence, and deception.”

Technology needs to have a “human heartbeat,” she said. It’s no longer enough just to have a good idea; we also need to know when it’s time to stop a bad one. But that’s becoming increasingly difficult as technology evolves faster than the society in which it is deployed.

She described the work of former MIT faculty member David Baltimore, who helped convene a debate among lawyers, scientists, and doctors in the 1970s to discuss new gene editing technology. He recognized the clear potential for misuse, and in the end an ethical framework was created and the technology’s development carried on, leading to major medical breakthroughs.

“The larger challenge is one all of us here today must face. The role of technology in our lives is growing — and that means our relationship with technology is changing. We have to change, too. We have to recognize the full weight of our responsibilities,” Sandberg said.

Society can no longer passively observe the changes, Sandberg said. Creators have a duty of care to ensure technology is used responsibly.

“When even with the best of intentions you go astray — as many of us have — you have the responsibility to course correct,” she said.

“Our challenge now is to be clear-eyed optimists … to build technology that improves lives and gives voice to those who often have none, while preventing misuse,” Sandberg said. “To build teams that better reflect the world around us, in all its complexity and diversity. If we succeed, we will build technology that better serves not just some of us, but all of us.”
